Convergence of nonparametric estimators for a regression function
Author
Abstract
In this paper we prove the asymptotic normality and rates of strong convergence of several types of estimators for the regression function in a fixed-design regression model. We consider the Gasser-Müller estimator and the Priestley-Chao estimator (univariate and multivariate). The proofs of asymptotic normality are based on a central limit theorem from an earlier paper by the author (1996, Stochastics and Stochastics Reports, 59, pp. 241-258). We prove that we can achieve the optimal rate of strong convergence known from regression estimation in the independent-data case.
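To fix ideas, the following is a minimal sketch of the two estimators on a fixed design in [0, 1], written with a Gaussian kernel; the kernel, the equispaced toy design, and the boundary conventions are illustrative assumptions rather than the paper's exact setup.

```python
import numpy as np
from scipy.stats import norm

def priestley_chao(x, y, grid, h):
    # Fixed design 0 < x_1 < ... < x_n <= 1; the spacings x_i - x_{i-1}
    # act as the weights of the kernel sum.
    dx = np.diff(x, prepend=0.0)                      # (n,)
    u = (grid[:, None] - x[None, :]) / h              # (m, n)
    return (norm.pdf(u) * dx * y).sum(axis=1) / h

def gasser_mueller(x, y, grid, h):
    # The weight of Y_i is the kernel integrated over the design cell
    # [s_{i-1}, s_i] around x_i; for a Gaussian kernel the integral is a
    # difference of normal CDFs.
    s = np.concatenate(([0.0], 0.5 * (x[1:] + x[:-1]), [1.0]))
    left = norm.cdf((grid[:, None] - s[None, :-1]) / h)
    right = norm.cdf((grid[:, None] - s[None, 1:]) / h)
    return ((left - right) * y).sum(axis=1)

# Toy example: equispaced fixed design, smooth mean plus noise.
rng = np.random.default_rng(0)
n = 200
x = (np.arange(1, n + 1) - 0.5) / n
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)
grid = np.linspace(0.05, 0.95, 50)
mhat_pc = priestley_chao(x, y, grid, h=0.05)
mhat_gm = gasser_mueller(x, y, grid, h=0.05)
```

The Gasser-Müller weights integrate the kernel over the design cells, while the Priestley-Chao weights use only the design spacings; on an equispaced design the two sketches give essentially the same fit.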
Similar Works
Wavelets for Nonparametric Stochastic Regression with Pairwise Negative Quadrant Dependent Random Variables
We propose a wavelet-based estimator of the regression function for a sequence of pairwise negative quadrant dependent random variables with a common one-dimensional probability density function. Some asymptotic properties of the proposed estimator are investigated. It is found that the estimators have similar properties to their counterparts st...
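As a rough illustration of this estimator class (not the paper's construction, and with i.i.d. noise standing in for the pairwise NQD errors), the following sketch performs term-by-term soft thresholding of empirical wavelet coefficients with PyWavelets; the dyadic sample size and the universal threshold are conventional assumptions.

```python
import numpy as np
import pywt

def wavelet_regression(y, wavelet="db4", level=4):
    """Term-by-term soft thresholding of empirical wavelet coefficients."""
    coeffs = pywt.wavedec(y, wavelet, level=level, mode="periodization")
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # MAD noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(y)))       # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet, mode="periodization")

# Equispaced design of dyadic size; i.i.d. noise stands in for NQD errors.
rng = np.random.default_rng(1)
n = 512
x = np.arange(n) / n
y = np.cos(4 * np.pi * x) + 0.3 * rng.standard_normal(n)
mhat = wavelet_regression(y)
```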
On the Minimax Optimality of Block Thresholded Wavelets Estimators for ?-Mixing Process
We propose a wavelet-based estimator of the regression function for a sequence of ?-mixing random variables with a common one-dimensional probability density function. Some asymptotic properties of the proposed estimator based on block thresholding are investigated. It is found that the estimators achieve optimal minimax convergence rates over large class...
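Block thresholding differs from the term-by-term rule sketched above in that detail coefficients are shrunk block by block according to each block's energy. The following is a minimal sketch of a James-Stein type block rule; the block length L ≈ log n and the constant λ ≈ 4.505 are conventional choices from the block-thresholding literature, not values taken from this paper.

```python
import numpy as np
import pywt

def block_threshold_regression(y, wavelet="db4", level=4, lam=4.505):
    """Shrink detail coefficients block by block (James-Stein type rule)."""
    coeffs = pywt.wavedec(y, wavelet, level=level, mode="periodization")
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # MAD noise estimate
    L = max(1, int(np.log(len(y))))                    # block length ~ log n
    out = [coeffs[0]]
    for d in coeffs[1:]:
        d = d.copy()
        for i in range(0, len(d), L):
            blk = d[i:i + L]
            s2 = float(np.sum(blk ** 2))
            # Keep the block only to the extent its energy exceeds the
            # noise level expected under pure noise.
            shrink = max(0.0, 1.0 - lam * len(blk) * sigma ** 2 / s2) if s2 > 0 else 0.0
            d[i:i + L] = shrink * blk
        out.append(d)
    return pywt.waverec(out, wavelet, mode="periodization")
```

The function can be called on the same toy data as the term-by-term sketch above.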
Nonparametric regression estimation on closed Riemannian manifolds
The nonparametric estimation of the regression function of a real-valued random variable Y on a random object X valued in a closed Riemannian manifold M is considered. A regression estimator which generalizes kernel regression estimators on Euclidean sample spaces is introduced. Under classical assumptions on the kernel and the bandwidth sequence, the asymptotic bias and variance are obtained, ...
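For intuition, a Nadaraya-Watson type estimator on the unit sphere S² can be obtained by replacing the Euclidean distance with the geodesic distance arccos⟨x, x′⟩; the sketch below is a simplified stand-in and omits the volume-density corrections a fully manifold-adapted estimator may carry.

```python
import numpy as np

def sphere_kernel_regression(X, y, X0, h):
    """Kernel regression on the unit sphere S^2: the geodesic distance
    arccos(<x, x'>) replaces the Euclidean distance |x - x'|."""
    cosang = np.clip(X0 @ X.T, -1.0, 1.0)   # (m, n) pairwise inner products
    d = np.arccos(cosang)                    # geodesic distances
    w = np.exp(-0.5 * (d / h) ** 2)          # Gaussian-type weights
    return (w @ y) / w.sum(axis=1)

# Toy data: random directions on S^2, response driven by the third coordinate.
rng = np.random.default_rng(2)
X = rng.standard_normal((400, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = X[:, 2] ** 2 + 0.1 * rng.standard_normal(400)
print(sphere_kernel_regression(X, y, X[:5], h=0.3))
```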
Nonparametric Regression with Errors in Variables
The effect of errors in variables in nonparametric regression estimation is examined. To account for errors in covariates, deconvolution is involved in the construction of a new class of kernel estimators. It is shown that optimal local and global rates of convergence of these kernel estimators can be characterized by the tail behavior of the characteristic function of the error distribution. In...
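A minimal sketch of such a deconvolution kernel estimator follows, assuming Laplace measurement error with known scale σ (so that 1/φ_ε(t/h) = 1 + σ²t²/h²) and a kernel whose Fourier transform is (1 − t²)³ on [−1, 1]; both are standard textbook choices and not necessarily those of the paper.

```python
import numpy as np

def deconv_kernel(u, h, sigma):
    """Deconvolution kernel: the integral of cos(t*u) * phi_K(t) / phi_eps(t/h)
    over t in [-1, 1], divided by 2*pi, for Laplace(sigma) error with
    phi_eps(t) = 1 / (1 + sigma^2 t^2) and phi_K(t) = (1 - t^2)^3."""
    t = np.linspace(-1.0, 1.0, 501)
    f = (1.0 - t ** 2) ** 3 * (1.0 + (sigma * t / h) ** 2)
    vals = np.cos(np.outer(u, t)) * f
    return vals.sum(axis=1) * (t[1] - t[0]) / (2.0 * np.pi)

def deconv_regression(w, y, grid, h, sigma):
    """Nadaraya-Watson weights built from the deconvolution kernel, applied
    to contaminated covariates W_i = X_i + eps_i.
    (Sketch: no guard against near-zero denominators.)"""
    u = (grid[:, None] - w[None, :]) / h
    k = deconv_kernel(u.ravel(), h, sigma).reshape(u.shape)
    return (k @ y) / k.sum(axis=1)

# Toy example: true design x is observed only through w = x + Laplace noise.
rng = np.random.default_rng(3)
n = 300
x = rng.uniform(0.0, 1.0, n)
w = x + rng.laplace(scale=0.05, size=n)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)
mhat = deconv_regression(w, y, np.linspace(0.1, 0.9, 30), h=0.15, sigma=0.05)
```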
Variance estimation in nonparametric regression via the difference sequence method (short title: Sequence-based variance estimation)
Consider a Gaussian nonparametric regression problem having both an unknown mean function and unknown variance function. This article presents a class of difference-based kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwidths are fully characterized, and asymptotic normality is also established. We also show that for ...
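As a concrete member of this class, a first-order difference-based kernel estimator of the variance function takes only a few lines; the paper treats general difference sequences and characterizes the optimal rates, so the sketch below is only the simplest instance.

```python
import numpy as np

def variance_function(x, y, grid, h):
    """First-order difference-based kernel estimator of sigma^2(x):
    smooth the pseudo-responses D_i = (Y_{i+1} - Y_i)^2 / 2, whose mean is
    close to the local variance when the mean function is smooth."""
    d = 0.5 * (y[1:] - y[:-1]) ** 2
    xm = 0.5 * (x[1:] + x[:-1])                 # attach D_i to cell midpoints
    u = (grid[:, None] - xm[None, :]) / h
    w = np.exp(-0.5 * u ** 2)                   # Gaussian kernel weights
    return (w @ d) / w.sum(axis=1)

# Heteroscedastic toy model: sigma(x) = 0.1 + 0.4 x on an equispaced design.
rng = np.random.default_rng(4)
n = 500
x = (np.arange(1, n + 1) - 0.5) / n
y = np.sin(2 * np.pi * x) + (0.1 + 0.4 * x) * rng.standard_normal(n)
sig2 = variance_function(x, y, np.linspace(0.1, 0.9, 20), h=0.1)
```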